991.
In depth-map generation algorithms, the parameter settings that yield an accurate disparity-map estimate are usually chosen empirically or based on unplanned experiments. Algorithm performance is measured by the distance between the algorithm's results and the ground truth, following the Middlebury standards. This work presents a systematic statistical approach, including exploratory data analysis of over 14000 images and designs of experiments using 31 depth maps, to measure the relative influence of the parameters and to fine-tune them based on the number of bad pixels. The implemented methodology improves the performance of adaptive-weight-based dense depth-map algorithms. As a result, a classical exploratory data analysis of the more than 14000 existing images reduces bad pixels from 16.78% to 14.48%, while designs of computer experiments with 31 runs yield an even better performance, lowering bad pixels from 16.78% to 13%.
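The bad-pixel criterion used above can be sketched in a few lines (a minimal illustration assuming the common Middlebury convention of counting pixels whose disparity error exceeds 1 px; the function name and toy arrays are hypothetical):

```python
import numpy as np

def bad_pixel_percentage(disparity, ground_truth, threshold=1.0):
    """Percentage of pixels whose absolute disparity error exceeds the
    threshold (1 px is the usual Middlebury default)."""
    err = np.abs(disparity - ground_truth)
    return 100.0 * np.count_nonzero(err > threshold) / err.size

est = np.array([[10.0, 12.0], [8.0, 5.0]])   # estimated disparities (toy)
gt  = np.array([[10.4, 14.0], [8.2, 5.1]])   # ground truth (toy)
print(bad_pixel_percentage(est, gt))  # 25.0  (1 of 4 pixels off by >1 px)
```

Tuning then amounts to minimising this percentage over the algorithm's parameter grid, which is what the designed experiments above do systematically.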
992.
Brazil's 2001 energy crisis monitored from space
Data sensed by the US Air Force Defense Meteorological Satellite Program (DMSP) Operational Linescan System (OLS) over Brazil during 2000 and 2001 were tested as a tool to monitor reductions in nocturnal lighting. This particular period was examined because the Brazilian population and industry were forced to reduce electric power consumption by 20% during 2001, relative to 2000, for a period of several months, starting officially on 1 June 2001. Large urban agglomerates were compelled to reduce city lighting by at least the same amount. The Distrito Federal (DF), which includes the Brazilian capital, Brasília, was one of the primary areas where the government actively sought reductions in electric power consumption. Using the DF as a study case, we demonstrate that the mean grey levels derived from averaging DMSP-OLS data acquired over urban centres appear to be a useful index for monitoring relative oscillations in energy consumption.
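The mean-grey-level index described here amounts to averaging OLS digital numbers over an urban footprint and comparing years; a toy sketch (the arrays, mask and values below are illustrative, not the actual DF composites):

```python
import numpy as np

def mean_grey_level(ols_image, urban_mask):
    """Average digital number (grey level) of DMSP-OLS pixels inside an
    urban mask; relative changes in this index track lighting reductions."""
    return float(ols_image[urban_mask].mean())

# Hypothetical annual composites over the same urban footprint
img_2000 = np.array([[60.0, 62.0], [58.0, 40.0]])
img_2001 = np.array([[48.0, 50.0], [46.0, 40.0]])
mask = np.array([[True, True], [True, False]])   # urban pixels only

change = mean_grey_level(img_2001, mask) / mean_grey_level(img_2000, mask) - 1.0
print(f"{change:.0%}")  # -20%
```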
993.
An understanding of surface energy fluxes is important for weather and climate models. This work therefore discusses the analysis of remotely sensed data, the estimation of surface energy balance components (fluxes), and the experiments and results used for validation. Data extracted from NOAA–AVHRR (National Oceanic and Atmospheric Administration–Advanced Very High Resolution Radiometer) satellite images were used to estimate the fluxes with the SEBAL (Surface Energy Balance Algorithm for Land) algorithm, as suggested by Bastiaanssen [Bastiaanssen, W.G.M., 1995, Regionalization of surface flux densities and moisture indicators in composite terrain. PhD thesis, Wageningen Agricultural University, The Netherlands]. To validate the results, a wide-ranging field experiment was organized near Dourados, a municipality in the Brazilian state of Mato Grosso do Sul, during the summer of 1999. The experiment involved the simultaneous acquisition of satellite images and in-situ measurements. SEBAL results are shown and discussed. The average errors obtained are less than 4%, 6% and 7% for the net surface radiation, surface heat flux and latent heat flux estimates, respectively, compared with the in-situ measurements.
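In SEBAL-type schemes the latent heat flux is typically obtained as the residual of the surface energy balance; a minimal sketch under that assumption (the numeric values are illustrative, not the Dourados measurements):

```python
def latent_heat_flux(rn, g, h):
    """Latent heat flux as the residual of the surface energy balance,
    LE = Rn - G - H (all fluxes in W m^-2), as in SEBAL-type schemes:
    net radiation minus soil heat flux minus sensible heat flux."""
    return rn - g - h

# Illustrative midday values, not the Dourados campaign measurements
le = latent_heat_flux(rn=600.0, g=90.0, h=150.0)
print(le)  # 360.0
```

Validation then compares each estimated term (Rn, the heat fluxes, and the residual LE) against the corresponding in-situ measurement.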
994.
Although weather surveillance radar (WSR)-88D stage III radar rainfall (RR) data can generally capture the spatial variability of precipitation fields, the rainfall depth tends to be underestimated for cold seasons dominated by stratiform storms. This study proposed merging WSR-88D stage III data with rain gauge data using a Haar wavelet scheme, and compared the result with that merged by the statistical objective analysis (SOA) scheme. The idea is to exploit the strength of radar, which better captures the spatial variability of rainfall, and that of rain gauges, which measure rainfall depth more accurately. A Haar wavelet was used because of its simplicity and the appealing physical interpretation of its coefficients as directional gradients of rainfall, whose spatial correlation structure was accounted for through a polynomial function. From an analysis of 89 storms over the Blue River Basin (BRB), Oklahoma, during 1994–2000, the results show that the underestimation problem of WSR-88D RR was generally more pronounced during the cold season, dominated by stratiform storms, than during the warm season, dominated by convective storms. The wavelet scheme was better than SOA at reducing the radar's underestimation of rainfall depths while maintaining the spatial variability of the original radar data, as shown by its merged rainfall patterns and by the more accurate streamflow hydrographs simulated by a semi-distributed, physics-based rainfall-runoff model – the semi-distributed physics-based hydrologic model using remote sensing (DPHM-RS) – driven by the merged data.
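The merging idea — keep the radar's Haar detail (gradient) coefficients, which carry the spatial variability, but correct the depth toward the gauges — can be sketched with a one-level 1-D Haar transform (a simplified stand-in for the paper's scheme; the rainfall values and gauge mean are hypothetical):

```python
import numpy as np

def haar_merge(radar, gauge_mean):
    """One-level 1-D Haar merge: keep the radar's detail (gradient)
    coefficients, but rescale the approximation coefficients so the
    field mean matches the more accurate gauge-derived rainfall depth."""
    pairs = radar.reshape(-1, 2)
    approx = pairs.mean(axis=1)                   # pairwise means
    detail = (pairs[:, 0] - pairs[:, 1]) / 2.0    # pairwise gradients
    approx = approx * (gauge_mean / approx.mean())  # depth bias correction
    merged = np.empty_like(radar, dtype=float)
    merged[0::2] = approx + detail                # inverse Haar step
    merged[1::2] = approx - detail
    return merged

radar = np.array([2.0, 4.0, 6.0, 8.0])      # radar underestimates depth
merged = haar_merge(radar, gauge_mean=7.5)  # gauges give the true mean depth
print(merged)  # local gradients preserved, mean raised to 7.5
```

The real scheme works in 2-D, interprets the detail coefficients as directional rainfall gradients, and models their spatial correlation with a polynomial function rather than a single global rescaling.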
995.
Applications of L-band SAR data to map deforestation are generally based on the assumption that undisturbed forests consistently exhibit higher radar backscatter than deforested areas. In this Letter we show that, depending on the stage of the deforestation process (slashing, burning and terrain clearing), this assumption is not always valid, and deforested areas may display stronger radar backscatter than primary forest. The analysis of multitemporal SAR images, supported by several Landsat Thematic Mapper (TM) images and field knowledge, showed that wood materials left behind by deforestation practices act as corner reflectors, causing an initial increase in radar backscatter that subsequently decreases over time as the debris on these fields is removed.
996.
A nonlinear controller for the stabilisation of the Furuta pendulum is presented. The control strategy is based on partial feedback linearisation: in a first stage, only the actuated coordinate of the Furuta pendulum is linearised. The stabilising feedback controller is then obtained by applying the Lyapunov direct method; that is, using this method we prove local asymptotic stability and demonstrate that the closed-loop system has a large region of attraction. The stability analysis is carried out by means of LaSalle's invariance principle. To assess the controller's effectiveness, the results of the corresponding numerical simulations are presented.
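Linearising only the actuated coordinate, as described above, is the standard collocated partial feedback linearisation construction. A generic sketch for a 2-DOF underactuated system with dynamics M(q)q̈ + h(q, q̇) = [τ, 0]ᵀ (a textbook construction with hypothetical numbers, not the paper's specific Furuta-pendulum controller):

```python
import numpy as np

def collocated_pfl_torque(M, h, v):
    """Collocated partial feedback linearisation for a 2-DOF underactuated
    system M(q)qdd + h(q, qd) = [tau, 0]^T: choose tau so the actuated
    coordinate obeys qdd1 = v exactly. The unactuated row,
    m12*qdd1 + m22*qdd2 + h2 = 0, is used to eliminate qdd2."""
    m11, m12 = M[0]
    m22 = M[1, 1]
    mbar = m11 - m12**2 / m22      # effective inertia seen by the actuator
    return mbar * v + h[0] - (m12 / m22) * h[1]

M = np.array([[2.0, 1.0],
              [1.0, 1.0]])         # hypothetical inertia matrix
h = np.array([0.5, 0.2])           # hypothetical Coriolis/gravity terms
tau = collocated_pfl_torque(M, h, v=3.0)
print(tau)  # 3.3
```

After this step the actuated coordinate behaves as a double integrator q̈₁ = v, and the stabilising outer loop (here, the Lyapunov-based design) chooses v.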
997.
Clearances at joints produce a loss of accuracy when positioning a mechanism. The end-effector pose error due to clearances depends on the mechanism configuration, the magnitude of the clearance itself and the applied external wrenches. In the neighbourhood of certain configurations, sudden changes occur in the actual posture of the mechanism owing to a change of contact mode at the joints; these sudden changes lead to positioning discontinuities along certain trajectories, or over the workspace. In this paper, a methodology for locating these discontinuities by means of a dynamic or kinetostatic analysis is presented. The advantages of choosing either the dynamic or the kinetostatic approach are analysed using the 5R planar mechanism.
998.
999.
Most of the decision support systems for balancing industrial assembly lines are designed to report a huge number of possible line configurations, according to several criteria. In this contribution, we tackle a more realistic variant of the classical assembly line problem formulation, time and space assembly line balancing. Our goal is to study the influence of incorporating user preferences based on Nissan automotive domain knowledge to guide the multi-objective search process with two different aims. First, to reduce the number of equally preferred assembly line configurations (i.e., solutions in the decision space) according to Nissan plants requirements. Second, to only provide the plant managers with configurations of their contextual interest in the objective space (i.e., solutions within their preferred Pareto front region) based on real-world economical variables. We face the said problem with a multi-objective ant colony optimisation algorithm. Using the real data of the Nissan Pathfinder engine, a solid empirical study is carried out to obtain the most useful solutions for the decision makers in six different Nissan scenarios around the world.
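The preference filtering described above — keeping only non-dominated configurations that also fall inside the decision maker's preferred region of the objective space — can be sketched as follows (the box-shaped preference region and the (cycle time, area) values are hypothetical, not Nissan data):

```python
def dominates(a, b):
    """True if solution a is at least as good as b in every objective
    (minimisation) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def preferred_front(solutions, region):
    """Non-dominated filter restricted to the decision maker's preferred
    box-shaped region of the objective space (per-objective bounds)."""
    inside = [s for s in solutions
              if all(lo <= v <= hi for v, (lo, hi) in zip(s, region))]
    return [s for s in inside if not any(dominates(t, s) for t in inside)]

# Hypothetical (cycle time [s], line area [m]) objective vectors
configs = [(30, 12), (28, 15), (35, 10), (40, 9)]
region = [(0, 36), (0, 14)]   # manager's acceptable bounds per objective
print(preferred_front(configs, region))  # [(30, 12), (35, 10)]
```

In the paper this role is played by preference information built into the multi-objective ant colony search itself, rather than by a post-hoc filter.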
1000.
Trusted collaborative systems require peers to be able to communicate over private, authenticated end-to-end channels. Network-layer approaches such as Virtual Private Networks (VPNs) exist, but require considerable setup and management, which hinders the establishment of ad-hoc collaborative environments: trust needs to be established, cryptographic keys need to be exchanged, and private network tunnels need to be created and maintained among end users. In this paper, we propose a novel system architecture which leverages existing social infrastructures to enable ad-hoc VPNs which are self-configuring and self-managing, yet maintain security amongst trusted and untrusted third parties. The key principles of our approach are: (1) self-configuring virtual network overlays enable seamless bi-directional IP-layer connectivity to socially connected parties; (2) online social networking relationships facilitate the establishment of trust relationships among peers; and (3) both centralized and decentralized databases of social network relationships can be securely integrated into existing public-key cryptography (PKI) implementations to authenticate and encrypt end-to-end traffic flows. The main contribution of this paper is a new peer-to-peer overlay architecture that securely and autonomously creates VPN tunnels connecting social peers, where online identities and social networking relationships may be obtained from centralized infrastructures, or managed in a decentralized fashion by the peers themselves. This paper also reports on the design and performance of a prototype implementation that embodies the SocialVPN architecture. The SocialVPN router builds upon IP-over-P2P (IPOP) virtual networks and a PKI-based tunneling infrastructure, which integrates with both centralized and decentralized social networking systems including Facebook, the Drupal open-source content management system, and emailing systems with PGP support.
We demonstrate our prototype's ability to support existing, unmodified TCP/IP applications while transparently dealing with user connectivity behind Network Address Translators (NATs). We also present qualitative and quantitative analyses of functionality and performance based on wide-area network experiments using PlanetLab and Amazon EC2.